Thick End of the Wedge

A simulation of $\large{e}$ - expectation of the number of "rolls" to escape 1

This is an example from Fermat's Library.

On average, if you select numbers uniformly at random from $[0, 1]$, how many do you need before their sum exceeds $1$?

The answer is $\large{e}$ $\approx 2.718 \dots$

The simulation is straightforward and the proof is quite clever.

1. Simulation

In [1]:
function meancount(N)
    t = 0                    # total number of draws across all trials
    for i in 1:N
        s = 0.0              # running sum for this trial
        while s <= 1.0
            s += rand()      # draw uniformly from [0, 1)
            t += 1
        end
    end
    return t / N             # average number of draws needed to exceed 1
end;

Julia is very strong at simulations, so it can run a large number of trials quite quickly. On my machine, 100 million simulations take less than 5 seconds.

In [4]:
@time meancount(100_000_000)
  4.813694 seconds (5 allocations: 176 bytes)
Out[4]:
2.71815012
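
As a rough sanity check on how close a run like this should land to $e$, here is a minimal sketch (a hypothetical variant of `meancount`, not in the original notebook) that also estimates the Monte Carlo standard error of the mean:

function meancount_with_se(N)
    total  = 0                 # sum of counts across trials
    total2 = 0                 # sum of squared counts, for the variance
    for i in 1:N
        s = 0.0
        c = 0
        while s <= 1.0
            s += rand()
            c += 1
        end
        total  += c
        total2 += c^2
    end
    m  = total / N                         # sample mean
    se = sqrt((total2 / N - m^2) / N)      # standard error of the mean
    return m, se
end

m, se = meancount_with_se(10_000_000)
(m, se, abs(m - exp(1)))   # the deviation from e should be on the order of se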

2. Proof

Some notation:

$N_x$ is the number of values selected from $[0,1]$ needed for the sum to exceed $x$. $m_x = \mathbb{E}(N_x)$, i.e. $m_x$ is the expected number of selections needed to exceed $x$. And lastly, $s$ is the first value selected (from $[0,1]$).


We want $m_1$.


Now
$$
N_1 = \begin{cases}
1 & \text{if } s > 1 \quad \text{(impossible here, since } s \in [0,1]\text{)} \\[0.8em]
1 + N_{1-s} & \text{if } s \le 1
\end{cases}
$$


Then, since $s$ is uniform on $[0,1]$, conditioning on its value gives:

$$
\begin{aligned}
\mathbb{E}(N_1) &= 1 + \int_0^1 \mathbb{E}(N_{1-s}) \, ds \\[1em]
&= 1 + \int_0^1 \mathbb{E}(N_s) \, ds \qquad \text{(substituting } s \mapsto 1-s\text{)} \\[1em]
m_1 &= 1 + \int_0^1 m_s \, ds
\end{aligned}
$$
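
Where does $e^x$ come from? A quick sketch (not spelled out above): the same conditioning argument works for any target $x \in [0,1]$, giving an integral equation that differentiation turns into a familiar ODE:

$$
\begin{aligned}
m_x &= 1 + \int_0^x m_s \, ds \\[0.5em]
\frac{d m_x}{d x} &= m_x, \qquad m_0 = 1 \\[0.5em]
\Rightarrow \quad m_x &= e^x
\end{aligned}
$$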


We can also verify directly that $m_x = e^x$ is a solution; plugging it into the equation for $m_1$:

$$
\begin{aligned}
m_1 &= 1 + \Big[ e^s \Big]_0^1 \\[1em]
&= 1 + e^1 - 1 \\[1em]
&= e
\end{aligned}
$$
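
As a final check, the closed form $m_x = e^x$ can also be tested by simulation for targets other than $1$. A minimal sketch, assuming a generalised `meancount_to(x, N)` (not part of the original notebook):

function meancount_to(x, N)
    t = 0
    for i in 1:N
        s = 0.0
        while s <= x            # keep drawing until the sum exceeds x
            s += rand()
            t += 1
        end
    end
    return t / N
end

for x in (0.25, 0.5, 0.75, 1.0)
    println((x, meancount_to(x, 1_000_000), exp(x)))
end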